A Contribution to the Defense of Liquid Democracy
Liquid democracy is a hybrid direct-representative decision making process
that provides each voter with the option of either voting directly or
delegating their vote to another voter, i.e., to a representative of their
choice. One of the proposed advantages of liquid democracy is that, in general,
it is assumed that voters will delegate their vote to others who are better
informed, which leads to more informed and better decisions. Considering an
audience from various knowledge domains, we provide an accessible high-level
analysis of a prominent critique of liquid democracy by Caragiannis and Micha.
Caragiannis and Micha's critique contains three central topics: 1. Analysis
using their -delegation model, which does not assume delegation to the
more informed; 2. Novel delegation network structures where it is advantageous
to delegate to the less informed rather than the more informed; and 3. Due to
NP-hardness, the implied impracticability of a social network obtaining an
optimal delegation structure. We show that in the real world, Caragiannis and
Micha's critique of liquid democracy has little or no relevance. Respectively,
our critique is based on: 1. The identification of incorrect
-delegation model assumptions; 2. A lack of novel delegation structures
and their effect in a real-world implementation of liquid democracy, which
would be guaranteed with constraints that sensibly distribute voting power; and
3. The irrelevance of an optimal delegation structure if the correct result is
guaranteed regardless. We conclude that Caragiannis and Micha's critique has no
significant negative relevance to the proposition of liquid democracy.
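The delegation mechanism at the heart of liquid democracy can be illustrated with a minimal sketch. This is not the authors' model or Caragiannis and Micha's delegation model, only a hypothetical tally in which each voter either votes directly or delegates, delegation chains are followed to a terminal direct vote, and cyclic chains cast no vote.

```python
from collections import Counter

def resolve_delegations(choices):
    """Resolve liquid-democracy delegations and tally votes.

    `choices` maps each voter either to ("vote", option) for a direct
    vote or to ("delegate", other_voter). A delegation chain is followed
    until it reaches a direct vote; a cyclic chain casts no vote.
    """
    tally = Counter()
    for voter in choices:
        seen = set()
        current = voter
        # Follow the delegation chain starting from this voter.
        while choices[current][0] == "delegate":
            if current in seen:
                break  # cycle: the chain never reaches a direct vote
            seen.add(current)
            current = choices[current][1]
        else:
            tally[choices[current][1]] += 1
    return tally
```

For example, with voters a and d delegating (d transitively through a) to b, who votes "yes", and c voting "no" directly, the tally is three "yes" to one "no"; this transitive accumulation of voting power is exactly what the constraints mentioned in point 2 of the abstract would bound.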
A formal framework for the specification of interactive systems
We are primarily concerned with interactive systems whose behaviour is highly reliant on end
user activity. A framework for describing and synthesising such systems is developed. This
consists of a functional description of the capabilities of a system together with a means of
expressing its desired 'usability'. Previous work in this area has concentrated on capturing
'usability properties' in discrete mathematical models.
We propose notations for describing systems in a 'requirements' style and a 'specification'
style. The requirements style is based on a simple temporal logic and the specification style is
based on Lamport's Temporal Logic of Actions (TLA) [74]. System functionality is specified as
a collection of 'reactions', the temporal composition of which define the behaviour of the system.
By observing and analysing interactions it is possible to determine how 'well' a user performs
a given task. We argue that a 'usable' system is one that encourages users to perform their tasks
efficiently (i.e. to consistently perform their tasks well); conversely, a system in which users
consistently perform their tasks well is likely to be a usable one.
The use of a given functionality linked with different user interfaces then gives a means
by which interfaces (and other aspects) can be compared and suggests how they might be
harnessed to bias system use so as to encourage the desired user behaviour. Normalising across
different users and different tasks moves us away from the discrete nature of reactions and
hence to comfortably describe the use of a system we employ probabilistic rather than discrete
mathematics.
We illustrate the framework with worked examples and propose an agenda for further work.
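The idea of specifying functionality as a temporal composition of 'reactions' can be sketched loosely. The following is a hypothetical illustration only, not the thesis's TLA-based notation: each reaction pairs an enabling guard with a state update, and behaviour is the result of firing enabled reactions over a sequence of user events.

```python
def reaction(guard, step):
    """A 'reaction' pairs an enabling condition with a state update,
    loosely mirroring a guard plus next-state relation."""
    return (guard, step)

def run(state, reactions, events):
    """Compose reactions temporally: for each event, fire the first
    enabled reaction (if any) to produce the next state."""
    for event in events:
        for guard, step in reactions:
            if guard(state, event):
                state = step(state, event)
                break
    return state

# Example: a toggle button whose only reaction responds to "press".
toggle = [reaction(lambda s, e: e == "press",
                   lambda s, e: {"on": not s["on"]})]
```

Running `run({"on": False}, toggle, ["press", "press", "press"])` yields `{"on": True}`; unmatched events such as `"tick"` leave the state unchanged, which is the discrete behaviour the thesis then generalises probabilistically across users and tasks.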
A study of the information needs of the users of a folk music library and the implications for the design of a digital library system
A qualitative study of user information needs is reported, based on a purposive sample of users and potential users of the Vaughan Williams Memorial Library, a small specialist folk music library in North London. The study set out to establish what the users’ (both existing and potential) information needs are, so that they can be taken into account in the design of the library’s online service. The information needs framework proposed by Nicholas (2000) is used as an analytical tool to achieve this end. The demographics of the users were examined in order to establish four user groups: Performer, Academic, Professional and Enthusiast. Important information needs were found to be based on social interaction, and key resources of the library were its staff, the concentration of the collection and the library’s social nature. A collection of broad design requirements is proposed based on the analysis, and the study also provided some insights into the issue of musical relevance, which are discussed.
Deriving prevalence estimates of depressive symptoms throughout middle and old age in those living in the community
BACKGROUND: There is considerable debate about the prevalence of depression in old age. Epidemiological surveys and clinical studies indicate mixed evidence for the association between depression and increasing age. We examined the prevalence of probable depression in the middle aged to the oldest old in a project designed specifically to investigate the aging process. METHODS: Community-living participants were drawn from several Australian longitudinal studies of aging that contributed to the Dynamic Analyses to Optimise Ageing (DYNOPTA) project. Different depression scales from the contributing studies were harmonized to create a binary variable that reflected "probable depression" based on existing cut-points for each harmonized scale. Weighted prevalence was benchmarked to the Australian population, which allowed comparison with findings from the 1997 and 2007 National Surveys of Mental Health and Well-Being (NSMHWB). RESULTS: In the DYNOPTA project, females were more likely to report probable depression. This was consistent across age levels. Neither the NSMHWB surveys nor DYNOPTA reported a decline in the likelihood of reporting probable depression for the oldest old in comparison with mid-life. CONCLUSIONS: Inconsistency in the reports of late-life depression prevalence in previous epidemiological studies may be explained by the exclusion and/or limited sampling of the oldest old. DYNOPTA addresses these limitations, and the results indicated no change in the likelihood of reporting depression with increasing age. Further research should extend these findings to examine within-person change in a longitudinal context and control for health covariates. NHMRC (National Health and Medical Research Council of Australia).
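The harmonization step described in the Methods can be sketched minimally: each contributing scale keeps its own cut-point, and a score is mapped to a single binary "probable depression" indicator. The scale names and cut-point values below are illustrative placeholders, not the values used in the DYNOPTA project.

```python
def probable_depression(scale, score, cut_points):
    """Map a score on a given depression scale to a binary
    'probable depression' indicator via that scale's cut-point.
    Scales and cut-points are supplied by the caller."""
    return score >= cut_points[scale]

# Hypothetical cut-points for two common scales (illustrative only).
CUT_POINTS = {"CES-D": 16, "GDS-15": 6}
```

For instance, a CES-D score of 20 would be coded as probable depression under these illustrative cut-points, while a GDS-15 score of 3 would not; applying the same binary coding across studies is what makes the pooled weighted prevalence comparable.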
Refining the definition of mandibular osteoradionecrosis in clinical trials: The cancer research UK HOPON trial (Hyperbaric Oxygen for the Prevention of Osteoradionecrosis)
Introduction: Mandibular osteoradionecrosis (ORN) is a common and serious complication of head and neck radiotherapy for which there is little reliable evidence for prevention or treatment. The diagnosis and classification of ORN have been inconsistently and imprecisely defined, even in clinical trials. Methods: A systematic review of diagnosis and classifications of ORN with specific focus on clinical trials is presented. The most suitable classification was evaluated for consistency using blinded independent review of outcome data (clinical photographs and radiographs) in the HOPON trial. Results: Of 16 ORN classifications found, only one (Notani) appeared suitable as an endpoint in clinical trials. Clinical records of 217 timepoints were analysed amongst 94 randomised patients in the HOPON trial. The only inconsistency in classification arose where minor bone spicules (MBS) were apparent, which occurred in 19% of patients. Some trial investigators judged MBS as clinically unimportant and not reflecting ORN; others classified it as ORN based on rigid definitions in common clinical use. When MBS was added as a distinct category to the Notani classification, this ambiguity was resolved and agreement between observers was achieved. Discussion: Most definitions and clinical classifications are based on retrospective case series and may be unsuitable for prospective interventional trials of ORN prevention or treatment. When ORN is used as a primary or secondary outcome in prospective clinical trials, the use of the Notani classification with the additional category of MBS is recommended, as it avoids subjectivity and enhances reliability and consistency of reporting.
Mobile satellite propagation measurements and modeling: A review of results for systems engineers
An overview of Mobile Satellite Service (MSS) propagation measurements and modeling is intended as a summary of current results. While such research is ongoing, the simple models presented here should be useful to systems engineers. A complete summary of propagation experiments with literature references is also included.
Testing the Scalar Triplet Solution to CDF's Fat W Problem at the LHC
The Type II Seesaw model remains a popular and viable explanation of neutrino
masses and mixing angles. By hypothesizing the existence of a scalar that is a
triplet under the weak gauge interaction, the model predicts strong
correlations among neutrino oscillation parameters, signals at lepton flavor
experiments, and collider observables at high energies. We investigate reports
that the Type II Seesaw can naturally accommodate recent measurements by the
CDF collaboration, which finds the mass of the W boson to be significantly
larger than allowed by electroweak precision data, while simultaneously evading
constraints from direct searches. Experimental scrutiny of this parameter space
in the Type II Seesaw has long been evaded since it is not characterized by
``golden channels'' at colliders but instead by cascade decays, moderate mass
splittings, and many soft final states. In this work, we test this parameter
space against publicly released measurements made at the Large Hadron Collider.
By employing a newly developed tool chain combining MadGraph5\_aMC@NLO and
Contur, we find that most of the favored space for this discrepancy is already
excluded by measurements of Standard Model final states. We give suggestions
for further exploration at Run III of the LHC, which is now underway. Comment: 8 pages
(incl. refs.), 3 figures; minor clarifications, matches published version
Testing the scalar triplet solution to CDF’s heavy W problem at the LHC
The type II seesaw model remains a popular and viable explanation of neutrino masses and mixing
angles. By hypothesizing the existence of a scalar that is a triplet under the weak gauge interaction, the
model predicts strong correlations among neutrino oscillation parameters, signals at lepton flavor
experiments, and collider observables at high energies. We investigate reports that the type II seesaw
can naturally accommodate recent measurements by the CDF collaboration, which finds the mass of the W
boson to be significantly larger than allowed by electroweak precision data, while simultaneously evading
constraints from direct searches. Experimental scrutiny of this parameter space in the type II seesaw has
long been evaded since it is not characterized by “golden channels” at colliders but instead by cascade
decays, moderate mass splittings, and many soft final states. In this work, we test this parameter space
against publicly released measurements made at the Large Hadron Collider. By employing a newly
developed tool chain combining MadGraph5_aMC@NLO and Contur, we find that most of the favored space
for this discrepancy is already excluded by measurements of Standard Model final states. We give
suggestions for further exploration at Run III of the LHC, which is now under way.